30 research outputs found
Resurrecting the Tromba Marina: A Bowed Virtual Reality Instrument using Haptic Feedback and Accurate Physical Modelling
This paper proposes a multisensory virtual reality simulation of a tromba marina – a bowed string instrument. The auditory feedback is generated by an accurate physical model, the haptic feedback is provided by a PHANTOM Omni, and the visual feedback is rendered through an Oculus Rift CV1 head-mounted display (HMD). Moreover, a user study is presented that explores the experience of interacting with a virtual bowed string instrument and evaluates the playability of the system. The study comprises both qualitative (observations, think-aloud and interviews) and quantitative (survey) data collection methods. The results indicate that the implementation was successful, offering participants realistic feedback and a satisfactory multisensory experience, allowing them to use the system as a musical instrument.
Real-time implementation of the shamisen using finite difference schemes
The shamisen is a Japanese three-stringed lute. It is a chordophone whose body is covered at the front by a tensioned membrane, which contributes greatly to the instrument's distinct sound. Although the shamisen is a traditional Japanese instrument, it is rare in the rest of the world, making it largely inaccessible to artists. To our knowledge, no physically modelled synthesizer of the shamisen is available, forcing producers and musicians to use samples. The objective of this paper is to make the shamisen's distinct sound more accessible to digital music artists. A real-time implementation of a shamisen physical model is presented, along with the derivation of its solution using finite-difference time-domain (FDTD) methods. The digital instrument sounds mostly as intended, though it lacks the shamisen's distinct buzzing sound, which requires further development.
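The FDTD approach referenced above can be sketched for the simplest related case, the 1D wave equation governing an ideal string. The following is a minimal illustration only — the parameter values, pluck, and pickup position are assumptions, not the paper's shamisen model (which couples strings to a membrane-covered body):

```python
import numpy as np

# Illustrative 1D wave-equation FDTD sketch (not the paper's shamisen model).
SR = 44100                 # sample rate (Hz)
k = 1.0 / SR               # time step (s)
c = 300.0                  # wave speed (m/s), illustrative value
L = 1.0                    # string length (m)

N = int(np.floor(L / (c * k)))  # grid intervals allowed by the stability limit
h = L / N                       # grid spacing (>= c*k, so the scheme is stable)
lam2 = (c * k / h) ** 2         # squared Courant number, lam <= 1

u_prev = np.zeros(N + 1)   # displacement at time step n-1
u = np.zeros(N + 1)        # displacement at time step n
u[N // 2] = 1e-3           # crude "pluck": small initial displacement

out = np.zeros(2000)
for n in range(2000):
    u_next = np.zeros(N + 1)
    # interior update: u^{n+1}_l = 2u^n_l - u^{n-1}_l + lam^2 (u^n_{l+1} - 2u^n_l + u^n_{l-1})
    u_next[1:-1] = (2.0 * u[1:-1] - u_prev[1:-1]
                    + lam2 * (u[2:] - 2.0 * u[1:-1] + u[:-2]))
    # endpoints stay zero: fixed (Dirichlet) boundary conditions
    u_prev, u = u, u_next
    out[n] = u[N // 3]     # read output at a "pickup" point
```

Choosing `N` from the stability condition `c*k/h <= 1` and then recomputing `h = L/N` is the standard way to fit the grid to the domain while staying stable.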
DigiDrum: A Haptic-based Virtual Reality Musical Instrument and a Case Study
This paper presents DigiDrum – a novel virtual reality musical instrument (VRMI) consisting of a physical drum augmented by virtual reality (VR) to produce enhanced auditory and haptic feedback. The physical drum membrane is driven by a simulated membrane whose parameters can be changed on the fly. The design and implementation of the instrument setup are detailed, together with the preliminary results of a user study investigating users' haptic perception of the material stiffness of the drum membrane. The study tests whether the tension in the membrane simulation and the sound damping (how fast the sound dies out) change users' perception of drum membrane stiffness. Preliminary results show that higher values for both tension and damping give the illusion of higher material stiffness in the drum membrane, with damping appearing to be the more important factor. The goal and contribution of this work are twofold: on the one hand, it introduces a musical instrument that allows for enhanced musical expression possibilities through VR; on the other, it presents an early, preliminary investigation of how haptics influence users' interaction in VRMIs.
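The two parameters the study varies — tension and damping — can be illustrated with a minimal lumped sketch. Here tension is mapped to an oscillator frequency and damping to an exponential decay rate; this mapping and all values are illustrative assumptions, not DigiDrum's implementation:

```python
import numpy as np

# Lumped sketch: damped oscillator u'' + 2*sigma*u' + omega^2 u = 0,
# discretised with a simple centred finite-difference scheme.
# "Tension" -> frequency, "damping" -> sigma; values are illustrative only.
SR = 44100
k = 1.0 / SR

def strike(freq_hz, sigma, dur=0.5):
    """Simulate a struck, damped mode; returns the output samples."""
    omega = 2.0 * np.pi * freq_hz
    n = int(dur * SR)
    u_prev, u = 1.0, 1.0          # unit initial displacement, zero velocity
    out = np.zeros(n)
    a = 2.0 - (omega * k) ** 2    # centred-difference coefficient
    for i in range(n):
        # (u+ - 2u + u-)/k^2 + 2*sigma*(u+ - u-)/(2k) + omega^2 u = 0,
        # solved for u+ (the next sample):
        u_next = (a * u - (1.0 - sigma * k) * u_prev) / (1.0 + sigma * k)
        u_prev, u = u, u_next
        out[i] = u
    return out

low_tension  = strike(200.0, sigma=5.0)   # loose, lightly damped: long ring
high_tension = strike(400.0, sigma=40.0)  # tight, heavily damped: fast decay
```

The decay envelope behaves as `exp(-sigma * t)`, so the heavily damped "stiff" setting dies out much faster — the perceptual effect the study probes.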
Implementing physical models in real-time using partitioned convolution: An adjustable spring reverb
Applied Physical Modeling for Sound Synthesis: The Yaybahar
In this paper, finite-difference time-domain methods are adopted to model a specific instrument, the Yaybahar, invented by Turkish artist Görkem Şen. Each part of the instrument is simulated independently and its physical behavior is explained in an intuitive yet accurate manner. The models are implemented in C++ to form an interactive, real-time application. Code and sound samples are available online.
The Dynamic Grid: Time-varying parameters for musical instrument simulations based on finite-difference time-domain schemes
Several well-established approaches to physical modeling synthesis for musical instruments exist. Finite-difference time-domain methods are known for their generality and flexibility in terms of the systems one can model, but are less flexible with regard to smooth parameter variations due to their reliance on a static grid. This paper presents the dynamic grid, a method to smoothly change the grid configurations of finite-difference time-domain schemes based on sub-audio-rate time variation of parameters. This allows the behavior of physical models to be extended beyond the physically possible, broadening the range of expressive possibilities for the musician. The method is applied to the 1D wave equation, the stiff string, and 2D systems, including the 2D wave equation and the thin plate. Results show that the method does not introduce noticeable artefacts when changing between grid configurations, even for systems including loss.
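The tension between time-varying parameters and a static grid can be made concrete for the 1D wave equation, where stability ties the grid spacing to the wave speed. The values below are illustrative assumptions, not the paper's; they only show that a smooth glide in wave speed implies a non-integer "ideal" grid size, which a static grid cannot track:

```python
import math

# For the 1D wave equation, stability requires lambda = c*k/h <= 1,
# i.e. h >= c*k, so the finest admissible grid has N = L / (c*k) intervals.
# A smooth glide in c therefore changes the ideal (fractional) N; the
# dynamic grid smoothly inserts/removes points instead of jumping sizes.
SR = 44100
k = 1.0 / SR
L = 1.0  # domain length (m)

def ideal_intervals(c):
    """Non-integer interval count implied by the stability limit h_min = c*k."""
    return L / (c * k)

for c in (735.0, 700.0, 661.5):   # a downward glide in wave speed (m/s)
    n_frac = ideal_intervals(c)
    print(f"c={c:6.1f}  N_ideal={n_frac:8.3f}  N_static={math.floor(n_frac)}")
```

A static implementation must round `N_ideal` down and keep it fixed, so it either wastes bandwidth or jumps discontinuously between grid sizes as `c` varies — the artefact the dynamic grid is designed to avoid.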
Linearly-implicit schemes for collisions in musical acoustics based on energy quadratisation
Collision modelling represents an active field of research in musical acoustics. Common examples of collisions include the hammer–string interaction in the piano, the interaction of strings with fretboards and fingers, the membrane–wire interaction in the snare drum, reed-beating effects in wind instruments, and others. At the modelling level, many current approaches make use of conservative potentials in the form of power laws, and the discretisations proposed for such models rely in all cases on iterative root-finding routines. Here, a method based on energy quadratisation of the nonlinear collision potential is proposed. It is shown that there exists a suitable discretisation of such a model that may be resolved in a single iteration, whilst guaranteeing stability via energy conservation. Applications to lumped as well as fully distributed systems are given, using both finite-difference and modal methods.
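The quadratisation step can be sketched for a standard power-law collision potential. The notation below is the common choice in the literature, assumed here rather than taken from the paper:

```latex
% Power-law collision potential: K stiffness, alpha exponent, eta penetration
\phi(\eta) = \frac{K}{\alpha + 1}\,[\eta]_+^{\alpha + 1},
\qquad [\eta]_+ = \max(\eta, 0)

% Auxiliary (quadratised) variable; epsilon > 0 keeps g well defined
\psi = \sqrt{2\phi(\eta) + \epsilon},
\qquad
g = \frac{\mathrm{d}\psi}{\mathrm{d}\eta}
  = \frac{\phi'(\eta)}{\sqrt{2\phi(\eta) + \epsilon}}

% The collision force and the potential energy are then expressed as
f = -\phi'(\eta) = -g\,\psi,
\qquad
V = \tfrac{1}{2}\left(\psi^2 - \epsilon\right) \ge -\tfrac{\epsilon}{2}
```

Because `psi` obeys `d(psi)/dt = g * d(eta)/dt`, discretising `psi` at interleaved time steps produces an update equation that is linear in the unknown displacement — hence solvable in a single step, with the non-negative quadratic energy `psi^2 / 2` providing the stability guarantee the abstract refers to.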